
    May 18th, 2017

    In the Big Data era, graph processing has been widely used to represent complex system structure, capture data dependencies and uncover relationship insights. Due to ever-growing graph scale and algorithm complexity, several distributed graph processing frameworks have attracted much interest from both academia and industry. In this talk, I will investigate how to achieve a trade-off between performance and cost for large-scale graph processing on the Cloud. System-aware and machine learning models are developed to predict the performance of distributed graph processing tasks. Cost-efficient resource provisioning strategies can then be recommended by selecting a certain number of VMs with specified capabilities, subject to predefined resource prices and user preferences. At the end of this talk, I will briefly introduce our recent projects on urban computing, disease simulation and social network analytics based on graph processing and real-world data.
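    A minimal sketch of the kind of cost-aware provisioning decision the abstract describes: given a learned performance model (represented here by a stand-in `predict_runtime` function, which is an assumption), choose the cheapest VM type and count whose predicted completion time meets a user deadline. The VM names, pricing fields, and toy predictor are illustrative, not details from the talk.

    ```python
    # Hypothetical provisioning sketch: enumerate VM types and counts, keep the
    # cheapest configuration whose predicted runtime satisfies the deadline.
    def cheapest_config(vm_types, max_vms, deadline_s, predict_runtime):
        best = None
        for vm in vm_types:                          # e.g. {"name": ..., "price_per_s": ...}
            for n in range(1, max_vms + 1):
                runtime = predict_runtime(vm, n)     # predicted execution time in seconds
                if runtime > deadline_s:
                    continue
                cost = runtime * n * vm["price_per_s"]
                if best is None or cost < best[0]:
                    best = (cost, vm["name"], n)
        return best  # (cost, VM type, number of VMs) or None if the deadline is infeasible

    # Toy usage with an assumed stand-in for the learned performance model.
    vm_types = [{"name": "standard-4", "price_per_s": 0.00005},
                {"name": "standard-8", "price_per_s": 0.00010}]

    def toy_predict(vm, n):
        base = 3600 if vm["name"] == "standard-4" else 2000
        return base / n                              # runtime shrinks with more VMs

    print(cheapest_config(vm_types, 8, 1800, toy_predict))
    ```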

    The Prospect of Enhancing Large-Scale Heterogeneous Federated Learning with Transformers

    Federated learning (FL) addresses data privacy concerns by enabling collaborative training of AI models across distributed data owners. Wide adoption of FL faces the fundamental challenges of data heterogeneity and the large scale of data owners involved. In this paper, we investigate the prospect of Transformer-based FL models for achieving generalization and personalization in this setting. We conduct extensive comparative experiments involving FL with Transformers, ResNet, and personalized ResNet-based FL approaches under various scenarios. These experiments consider varying numbers of data owners to demonstrate Transformers' advantages over deep neural networks in large-scale heterogeneous FL tasks. In addition, we analyze the superior performance of Transformers by comparing the Centered Kernel Alignment (CKA) representation similarity across different layers and FL models to gain insight into the reasons behind their promising capabilities.
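    As a rough illustration of the CKA comparison mentioned above, the sketch below computes linear CKA between two layers' activation matrices. The activation shapes, the toy data, and the choice of linear (rather than kernel) CKA are assumptions for illustration, not details taken from the paper.

    ```python
    import numpy as np

    def linear_cka(X, Y):
        """Linear Centered Kernel Alignment between two activation matrices.

        X: (n_samples, d1) activations from one layer/model.
        Y: (n_samples, d2) activations from another layer/model.
        Returns a similarity score in [0, 1].
        """
        X = X - X.mean(axis=0, keepdims=True)   # center each feature
        Y = Y - Y.mean(axis=0, keepdims=True)
        cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
        norm = np.linalg.norm(X.T @ X, ord="fro") * np.linalg.norm(Y.T @ Y, ord="fro")
        return cross / norm

    # Toy usage: CKA is invariant to isotropic scaling, so a rescaled,
    # slightly noisy copy of the activations scores close to 1.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(512, 64))
    B = 2.0 * A + 0.01 * rng.normal(size=(512, 64))
    print(linear_cka(A, B))   # close to 1.0
    ```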

    RA2: predicting simulation execution time for cloud-based design space explorations

    Design space exploration refers to the evaluation of implementation alternatives for many engineering and design problems. A popular exploration approach is to run a large number of simulations of the actual system with varying sets of configuration parameters to search for the optimal ones. Due to the potentially huge resource requirements, cloud-based simulation execution strategies should be considered in many cases. In this paper, we look at the issue of running large-scale simulation-based design space exploration problems on commercial Infrastructure-as-a-Service clouds, namely Amazon EC2, Microsoft Azure and Google Compute Engine. To efficiently manage the cloud resources used for execution, the key problem is to accurately predict the running time of each simulation instance in advance. This is not trivial due to the wide range of cloud resource types on offer, which provide varying levels of performance. In addition, the widespread use of virtualization techniques by most cloud providers often introduces unpredictable performance interference. We propose a resource and application-aware (RA2) prediction approach to combat this performance variability on clouds. In particular, we employ neural network-based techniques coupled with non-intrusive monitoring of resource availability to obtain more accurate predictions. We conducted extensive experiments on commercial cloud platforms using an evacuation planning design problem over a month-long period. The results demonstrate that simulation execution times can be predicted with high accuracy in most cases. The experiments also provide some interesting insights into how similar simulation problems should be run on various commercially available clouds.
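    A minimal sketch of the general idea, not the RA2 model itself: train a small neural network regressor on simulation configuration parameters combined with monitored resource-availability features to estimate execution time. The feature names, synthetic data, and network size are all illustrative assumptions.

    ```python
    # Illustrative execution-time predictor: configuration parameters plus
    # resource-availability metrics in, estimated runtime (seconds) out.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Assumed feature columns: [population_size, num_exits, cpu_available_frac, io_wait_frac]
    X = rng.uniform(size=(200, 4))
    # Synthetic runtimes standing in for measured execution times.
    y = 50 * X[:, 0] + 20 * X[:, 1] + 30 * (1 - X[:, 2]) + 10 * X[:, 3]

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    )
    model.fit(X, y)
    print(model.predict(X[:3]))  # predicted execution times for three configurations
    ```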

    Time for a Radical Reappraisal of Tourist Decision Making? Toward a New Conceptual Model

    General models of tourist decision making have been developed to theorize tourist decision processes. These models have been based on the premise that tourists are rational decision makers and utility maximizers. Further, these models have been operationalized through input–output models to measure preferences and behavioral intentions. The extent to which they remain viable for explaining and predicting tourist behavior as tourism markets mature, however, is uncertain. This review article critiques these approaches and proposes a new general model based on dual system theory that accounts for different types of choice strategies and the constructive nature of preferences, and recognizes the individual and contextual factors that influence choice processes. The article argues that a general tourist choice model should integrate the psychological processes that determine choice strategies, or heuristics, and consider choice context. These include individual differences, task-related factors, and principles determining system engagement. Future research and practical implications are outlined.